# Model Fusion
## DeepSeek-R1T-Chimera
- Author: tngtech · License: MIT · Downloads: 491 · Likes: 158
- Tags: Large Language Model, Transformers

DeepSeek-R1T-Chimera is an open-weights model that combines the intelligence of DeepSeek-R1 with the token efficiency of DeepSeek-V3.
## L3-GothicMaid-Upscaled-11B
- Author: yamatazen · Downloads: 14 · Likes: 3
- Tags: Large Language Model, Transformers, English

A language model built with the mergekit tool, upscaled from an 8B base to 11B parameters using the Passthrough merge method.
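The Passthrough method does not average weights at all; it stitches together slices of existing decoder layers, which is how an 8B checkpoint can be depth-upscaled to roughly 11B. Below is a minimal sketch of that layer-stacking idea; the layer ranges and model name are purely hypothetical, not this model's actual recipe.

```python
# Illustrative sketch of a Passthrough (depth-upscale) merge plan: the merged
# model is just a stack of layer slices copied from existing checkpoints,
# possibly with a block of layers repeated to increase depth.
from dataclasses import dataclass


@dataclass
class Slice:
    source: str      # model the layers are copied from
    layers: range    # which decoder layers to take, in order


def passthrough_plan(slices: list[Slice]) -> list[tuple[str, int]]:
    """Return the stacked layout as (source model, source layer index) pairs."""
    plan = []
    for s in slices:
        plan.extend((s.source, i) for i in s.layers)
    return plan


# Hypothetical upscale: keep layers 0-23, repeat 8-23, then keep 24-31.
plan = passthrough_plan([
    Slice("base-8B", range(0, 24)),
    Slice("base-8B", range(8, 24)),
    Slice("base-8B", range(24, 32)),
])
print(len(plan), "layers in the upscaled model")  # 48 layers instead of 32
```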
## Cursa-O1-7B-v1.1
- Author: marcuscedricridia · Downloads: 40 · Likes: 2
- Tags: Large Language Model, Transformers

A merge of pre-trained language models created with the SLERP method, combining the strengths of the pre-cursa-o1-v1.2 and post-cursa-o1 models.
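SLERP interpolates each pair of weight tensors along the arc between them rather than along a straight line, which preserves the magnitude of the blended weights better than plain averaging. The NumPy sketch below shows the interpolation step in isolation; the tensor shapes and the value of t are illustrative, not the recipe used for this model.

```python
# Minimal sketch of SLERP (spherical linear interpolation) between two weight
# tensors; a merge tool such as mergekit applies this per tensor with an
# interpolation factor t.
import numpy as np


def slerp(w_a: np.ndarray, w_b: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    a, b = w_a.ravel(), w_b.ravel()
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    omega = np.arccos(np.clip(np.dot(a_n, b_n), -1.0, 1.0))
    if omega < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        return (1 - t) * w_a + t * w_b
    so = np.sin(omega)
    out = (np.sin((1 - t) * omega) / so) * a + (np.sin(t * omega) / so) * b
    return out.reshape(w_a.shape)


# Interpolate halfway between two hypothetical checkpoints' tensors.
merged = slerp(np.random.randn(512, 512), np.random.randn(512, 512), t=0.5)
```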
## BioMistral-MedMNX
- Author: BioMistral · Downloads: 3,509 · Likes: 4
- Tags: Large Language Model, Transformers

BioMistral-MedMNX is a specialized biomedical language model created by merging several pre-trained models, using the DARE and TIES merge methods to optimize performance.
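DARE and TIES both operate on each model's delta from a shared base: DARE randomly drops a fraction of the delta entries and rescales the survivors, while TIES resolves sign conflicts between models before the deltas are added back into the base. The rough NumPy sketch below shows those two steps with made-up shapes and drop rates, not the configuration actually used for BioMistral-MedMNX.

```python
# Rough sketch of the ideas behind a DARE + TIES merge of several fine-tunes
# that share one base model.
import numpy as np


def dare(delta: np.ndarray, drop_rate: float, rng: np.random.Generator) -> np.ndarray:
    """DARE: randomly drop delta entries and rescale the remaining ones."""
    mask = rng.random(delta.shape) >= drop_rate
    return (delta * mask) / (1.0 - drop_rate)


def ties_combine(deltas: list[np.ndarray]) -> np.ndarray:
    """TIES-style sign election: keep only deltas that agree with the majority sign.
    (The TIES 'trim' step that keeps only the largest deltas is omitted for brevity.)"""
    stacked = np.stack(deltas)
    sign = np.sign(stacked.sum(0))                  # elected sign per parameter
    agree = np.sign(stacked) == sign                # which deltas agree with it
    return np.where(agree, stacked, 0.0).sum(0) / np.maximum(agree.sum(0), 1)


rng = np.random.default_rng(0)
base = np.zeros((4, 4))
finetunes = [base + rng.normal(size=(4, 4)) for _ in range(3)]
deltas = [dare(ft - base, drop_rate=0.5, rng=rng) for ft in finetunes]
merged = base + ties_combine(deltas)
```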
## BioMistral-7B-slerp
- Author: BioMistral · License: Apache-2.0 · Downloads: 84 · Likes: 6
- Tags: Large Language Model, Transformers, Multilingual

BioMistral-7B-slerp is a medical-domain language model obtained by merging BioMistral-7B and Mistral-7B-Instruct-v0.1 with the SLERP method; it specializes in biomedical text processing.
## Sirius-10B
- Author: FelixChao · License: Apache-2.0 · Downloads: 83 · Likes: 1
- Tags: Large Language Model, Transformers

Sirius-10B is a large language model created by merging the TurdusBeagle-7B and Severus-7B models.
## blockchainlabs_7B_merged_test2_4
- Author: alnrg2arg · Downloads: 90 · Likes: 3
- Tags: Large Language Model, Transformers

blockchainlabs_7B_merged_test2_4 is a 7B-parameter large language model created by merging mlabonne/NeuralBeagle14-7B and udkai/Turdus with the mergekit tool.
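All of the entries above are tagged for the Transformers library, so any of them can be tried the same way once available on the Hugging Face Hub. The example below is a generic sketch; the repo id is assembled from the author and model name shown in the last entry and may need adjusting to the exact Hub path.

```python
# Load a merged checkpoint with the Transformers library and generate text.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "alnrg2arg/blockchainlabs_7B_merged_test2_4"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # load in the checkpoint's stored precision
    device_map="auto",    # requires `accelerate`; places weights on available devices
)

inputs = tokenizer("Merged models are", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```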